Computer ethics is a part of practical philosophy concerned with how computing professionals should make decisions regarding professional and social conduct. Margaret Anne Pierce, a professor in the Department of Mathematics and Computers at Georgia Southern University, has categorized the ethical decisions related to computer technology and usage into three primary influences.
Later that same year, the world's first computer crime was committed. A programmer used computer code to keep his bank account from being flagged as overdrawn. Because no laws existed at the time to prohibit this, he was not charged. To ensure that others did not follow suit, an ethics code for computing was needed.
In 1973, the Association for Computing Machinery (ACM) adopted its first code of ethics. SRI International's Donn Parker, an author on computer crimes, led the committee that developed the code.
In 1976, medical teacher and researcher Walter Maner noticed that ethical decisions are much harder to make when computers are involved. He saw the need for a separate branch of ethics to deal with computers, and so coined the term "computer ethics".
In 1976 Joseph Weizenbaum made his second significant contribution to the field of computer ethics. He published a book titled Computer Power and Human Reason, which argued that while artificial intelligence may benefit the world, it should never be allowed to make the most important decisions because it lacks human qualities such as wisdom. By far the most important point he makes in the book is the distinction between choosing and deciding. He argued that deciding is a computational activity while making choices is not, and thus the ability to make choices is what makes us human.
Later that same year, Abbe Mowshowitz, a professor of Computer Science at the City College of New York, published an article titled "On approaches to the study of social issues in computing," which identified and analyzed technical and non-technical biases in research on social issues in computing.
During 1978, the Right to Financial Privacy Act was adopted by the United States Congress, drastically limiting the government's ability to search bank records.
During the next year, Terrell Ward Bynum, a professor of philosophy at Southern Connecticut State University and Director of the Research Center on Computing and Society there, developed a curriculum for a university course on computer ethics. Bynum was also editor of the journal Metaphilosophy. In 1983 the journal held an essay contest on the topic of computer ethics and published the winning essays in its best-selling 1985 special issue, "Computers and Ethics."
In 1984, the United States Congress passed the Small Business Computer Security and Education Act, which created a Small Business Administration advisory council to focus on computer security related to small businesses.
In 1985, James H. Moor, professor of philosophy at Dartmouth College in New Hampshire, published an essay called "What is Computer Ethics?" In this essay Moor states that computer ethics includes the following: "(1) identification of computer-generated policy vacuums, (2) clarification of conceptual muddles, (3) formulation of policies for the use of computer technology, and (4) ethical justification of such policies."
During the same year, Deborah G. Johnson, professor of Applied Ethics and chair of the Department of Science, Technology, and Society in the School of Engineering and Applied Sciences of the University of Virginia, published the first major computer ethics textbook. Johnson's textbook identified the major issues for research in computer ethics for more than a decade after the publication of its first edition.
In 1988, Robert Hauptman, a librarian at St. Cloud State University, coined the term "information ethics" to describe the storage, production, access, and dissemination of information. Around the same time, the Computer Matching and Privacy Protection Act was adopted, restricting United States government programs that identify debtors.
In 1992, the ACM adopted a new set of ethical rules, the "ACM Code of Ethics and Professional Conduct," which consisted of 24 statements of personal responsibility.
Three years later, in 1995, Krystyna Górniak-Kocikowska, a professor of philosophy at Southern Connecticut State University, Coordinator of the Religious Studies Program, and a senior research associate in the Research Center on Computing and Society, proposed that computer ethics would eventually become a global ethical system and, soon after, would replace ethics altogether as the standard ethics of the information age.
In 1999, Deborah Johnson expressed a view quite contrary to Górniak-Kocikowska's, arguing that computer ethics will not evolve into a new system but will rather be our old ethics with a slight twist.
Since the end of the 20th century, as a result of much debate over ethical guidelines, many organizations such as ABET offer ethical accreditation to university and college programs such as "Applied and Natural Science, Computing, Engineering and Engineering Technology at the associate, bachelor, and master levels" to promote quality work that follows sound ethical and moral guidelines.
Ethical considerations have been linked to the Internet of Things (IoT), with many physical devices now connected to the internet.
Cryptocurrency raises questions about the balance of the current purchasing relationship between buyer and seller.
Autonomous technology such as self-driving cars may be forced to make human-like ethical decisions. There is also concern over how autonomous vehicles would behave in different countries with different cultural values.
Security risks have been identified with cloud computing, as every user interaction is sent to and analyzed by central computing hubs. Artificial intelligence devices like Amazon Alexa and Google Home collect personal data from users at home and upload it to the cloud. Apple's Siri and Microsoft's Cortana smartphone assistants collect user information, analyze it, and then send results back to the user.
A whole industry of privacy and ethical tools has grown over time, giving people the choice not to share their data online. These tools are often open-source software, which allows users to verify that their data is not saved or used without their consent.